Add max_context_length to TextEncode node for LLM max tokens - experimental use #289
+963
−26
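The PR title describes adding a `max_context_length` option to a TextEncode node so that LLM token input can be capped. The diff itself is unavailable, so the following is only a minimal sketch of how such a cap could work; the function name `encode_with_limit`, the toy tokenizer, and the truncate-to-N behavior are assumptions, not taken from the PR.

```python
# Hypothetical sketch: capping tokenized input at max_context_length.
# Names and behavior are illustrative, not from the actual PR diff.

def encode_with_limit(text, tokenizer, max_context_length=0):
    """Tokenize `text`; if max_context_length > 0, keep only that many tokens."""
    tokens = tokenizer(text)
    if max_context_length > 0:
        tokens = tokens[:max_context_length]
    return tokens

# Toy whitespace tokenizer standing in for a real model tokenizer.
toy_tokenizer = lambda s: s.split()

print(encode_with_limit("a b c d e", toy_tokenizer, max_context_length=3))
```

With `max_context_length=0` the input passes through untruncated, which would preserve the node's existing behavior by default.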